Hi all,
Thank you for the useful tutorial. It is really helpful. Dear AWS users, please note that when you create the server (Transfer Family), you will start being charged per hour, even if you are within the free 12-month trial period.
you nailed it! Thanks for walking through this so concisely!
Excellent tutorial. Instructions and your voice are clear. Thank you so much!
To the point and covers all the things perfectly.
2022 update: when creating a role, it will ask you for a use case, and there are going to be two radio buttons: EC2 and Lambda. Below them there's a dropdown for "Use cases for other AWS services"; select "Transfer" there, and that's how I got it working. The policies remained the same as in the video.
AWS can seem kind of complex and this doesn't guarantee it will work for you, but I wanted to post what worked for me in case you land in a similar scenario.
Having an issue assuming the user role when authenticating.
Not sure if something changed with regard to the IAM policies that need to be assigned... and of course my company doesn't have any AWS contracts, so I can't contact AWS to see what's up.
Lovely.
Hi! Thanks a lot for this video :) I had to do this SFTP setup using Terraform, and it makes no sense to use Terraform until you know what has to be done in the cloud, so this video made a lot of sense!
Thank you for producing this. Very useful.
Great tutorial... thanks... the instructions are clear and easy to follow.
Instructions are very clear. Thank you for the video
Thank you very much for your contribution. Keep it up; you have helped me a lot. Greetings from Ecuador!!!
Excellent tutorial. Keep it up. Thanks.
Thank you, Enrico! Your tutorial was really helpful.
Thanks!
Excellent tutorial. Thanks
Super well explained video. Thanks.
Fantastic tutorial!! Thank you so much!!
Thank you very much for this Enrico!
Excellent video, this will save me a lot of time!
Thanks for the video. Your video and explanation are good but the volume needs to be increased.
This tutorial was very helpful! Thank you so much!
Can you explain why you didn't use a password for connecting to the server in Cyberduck? Also, I didn't see an option to create a password in AWS transfer family. Thank you.
If you auto-generate a policy then what is the point of creating one before? When I try to Auto-generate I get an error "Failed to edit user details (${transfer:Home*} variable used in policy for a user with a logical home directory)"
Thanks for the comment. Two different roles are needed: one for the SFTP server to access the S3 bucket and one for the scope-down policy.
I think you are getting that error because the SFTP service has been updated to support chroot and logical directories. You don't need to create the user policy anymore. More information here: aws.amazon.com/blogs/storage/simplify-your-aws-sftp-structure-with-chroot-and-logical-directories/
@@EnricoPortolan I just used policy none and created the user correctly, going to test it out.
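For anyone hitting the same error: the AWS-documented scope-down (session) policy pattern uses the `${transfer:HomeBucket}`, `${transfer:HomeFolder}`, and `${transfer:HomeDirectory}` variables, which only work with a path-based home directory. With a *logical* home directory those variables are not allowed (which is exactly what the error says), and you must spell out the bucket ARNs directly. A sketch along the lines of the documented example, for the path-based case:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowListingOfUserFolder",
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::${transfer:HomeBucket}"],
      "Condition": {
        "StringLike": {
          "s3:prefix": ["${transfer:HomeFolder}/*", "${transfer:HomeFolder}"]
        }
      }
    },
    {
      "Sid": "HomeDirObjectAccess",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:GetObjectVersion"],
      "Resource": "arn:aws:s3:::${transfer:HomeDirectory}*"
    }
  ]
}
```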
Thanks dude! 🎉
This is very helpful, thanks a lot! :)
That was very useful, ty.
Very helpful. Thank you
Followed it all EXACTLY and got an Access Denied when trying to log in...
Nice and clear. Thank you!
great tutorial, thanks!
Brilliant!
Amazing man!
Thanks, it was really helpful.
Great video!!
I am getting stuck in the SFTP server creation steps; it asks for a Workflow, and I don't know how to fill that in.
Good one!
Can we use AWS Transfer Family as an FTP/transfer mechanism to move files between a third-party (or supplier) endpoint and an on-premises application (rather than S3)? I'm thinking of using the AWS FTP service as a middleware/integration layer.
Hey there, great video and excellent information :)
Can we send files from a Linux machine to an AWS S3 bucket using the sftp command line through a similar setup? If yes, could you please help me with its syntax?
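Not from the video, but a minimal sketch of the sftp command-line syntax against a Transfer Family endpoint; the endpoint, user name, and key path below are hypothetical placeholders:

```shell
# Hypothetical placeholders -- substitute your own endpoint, user, and private key.
HOST="s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com"
USER="sftpuser"
KEY="$HOME/.ssh/transfer_key"

# Interactive session: run this to get an sftp> prompt, then use put/get.
CMD="sftp -i $KEY $USER@$HOST"
echo "$CMD"

# Non-interactive upload, reading a batch command from stdin:
# echo "put report.csv" | sftp -i "$KEY" -b - "$USER@$HOST"
```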
would like to see more
I would use bucketName/folderName/* instead for the resource name for additional security
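To illustrate the point above, the object-level statement could scope its Resource down to a single folder; the bucket and folder names here are hypothetical:

```json
{
  "Effect": "Allow",
  "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
  "Resource": ["arn:aws:s3:::my-bucket/incoming/*"]
}
```

You would still need `s3:ListBucket` on the bucket ARN itself, ideally with an `s3:prefix` condition so listing is limited to that folder too.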
Hey, thanks for the tutorial. I've got a question for you, I used route 53 to set up a custom hostname, but when I use this custom name instead of the endpoint in filezilla the connection is refused because the host does not exist. Any idea what I am missing? Thanks!
you need to add a CNAME. More info: docs.aws.amazon.com/transfer/latest/userguide/requirements-dns.html
@@EnricoPortolan Thank you mate, I appreciate you taking the time to help out.
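For reference, the record that page describes is a plain CNAME from the custom hostname to the server endpoint; with hypothetical names the zone entry would look like:

```
; Hypothetical names -- replace with your own hostname and server endpoint.
sftp.example.com.  300  IN  CNAME  s-1234567890abcdef0.server.transfer.us-east-1.amazonaws.com.
```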
I have done everything, but I still get the following error:
Couldn't stat remote file: Permission denied
Is there any latency before the policy/role is reflected in SFTP, or is it instant?
Check if the bucket is encrypted. If you follow the exact steps, it should work.
Also, the link in your bio doesn't work.
you should've posted the policy here.
Hi bro, can you please help me with this. It is not working for me. How can I connect with you?
I think there was a policy issue. I copied and pasted the policy again from the official site and it worked fine. Now I need to understand how I can automate this if we want to give access to a client who is going to download and upload files quite often. What would be the best practice?
I think the best practice is to ask the client to give you a public key and create a user in Transfer Family with that public key, so the client can download/upload as needed.
@@EnricoPortolan In our scenario, data and storage would be within our infrastructure. Data needs to be downloaded, modified, and uploaded to our S3 bucket.
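The key-based onboarding described above can be scripted with the AWS CLI. A sketch only: the server ID, role ARN, bucket, and key path are hypothetical placeholders, and the actual call is commented out because it needs AWS credentials:

```shell
# Hypothetical placeholders -- replace with your own values before running.
SERVER_ID="s-1234567890abcdef0"
USER_NAME="client-user"
ROLE_ARN="arn:aws:iam::123456789012:role/sftp-s3-access"
PUBKEY_FILE="$HOME/.ssh/client_key.pub"

# Create the SFTP user with the client's public key (commented out; needs credentials):
# aws transfer create-user \
#   --server-id "$SERVER_ID" \
#   --user-name "$USER_NAME" \
#   --role "$ROLE_ARN" \
#   --home-directory "/my-bucket/$USER_NAME" \
#   --ssh-public-key-body "file://$PUBKEY_FILE"

echo "would create user $USER_NAME on server $SERVER_ID"
```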
I need to add the json of the policy :(
Can we use only one SFTP server for different/multiple file transfer pipelines?
Yes of course as it’s backed by an S3 bucket
@@EnricoPortolan Does that mean I can use just one SFTP server with about 10 different pipelines, pointing to 10 different S3 buckets?
How can I reach the AWS SFTP server over the Internet?
You can set up the SFTP server with a public URL, as shown in the video.
7:38 connect ftp with s3
Good video, except for the continuous lip smacking...
Hi Enrico, a noob question here: what about access for the customers? How are they going to be able to transfer files? Would it be via Cyberduck as well?
Yes, exactly; Cyberduck would work, or any other FTP client.
Hi, great video :-)
I just followed all the steps one by one, and at the end FileZilla can connect to the SFTP server, but I cannot list, read, or write there, so the folder looks empty.
Any idea why?
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ReadWriteS3",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::s3bucketname"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:DeleteObjectVersion",
                "s3:GetObjectVersion",
                "s3:GetObjectACL",
                "s3:PutObjectACL"
            ],
            "Resource": [
                "arn:aws:s3:::s3bucketname/*"
            ],
            "Sid": ""
        }
    ]
}
There is definitely an issue with the IAM policy. Can you check the policy of the user you use for the SFTP connection? Also make sure you have configured the home directory when creating the SFTP user.
@@EnricoPortolan I'd forgotten to set up the home directory for the user; it works fine now. Such a simple solution, thanks.
@@Digitronus happy to help 🎉